The new features of AbiPy v0.9.1¶

M. Giantomassi and the AbiPy group¶

10th international ABINIT developer workshop
May 31 - June 4, 2021 - Smart Working, Lockdown@BE


  • These slides have been generated using jupyter, nbconvert and revealjs
  • The notebook can be downloaded from this github repo
  • To install and configure the software, follow these installation instructions

Use the Space key to navigate through all slides.

What is AbiPy?¶

Python package for:¶

  • Generating ABINIT input files automatically
  • Post-processing results extracted from netcdf or text files
  • Interfacing ABINIT with external tools such as vesta, wannier90, lobster, etc.
  • Executing ABINIT-specific workflows on laptops as well as on HPC clusters

Dependencies:¶

  • Hard deps: pymatgen, netcdf4, matplotlib, plotly, numpy, scipy, pandas, ipython …
  • Soft deps: ASE, phonopy, jupyter …

NB: AbiPy can be interfaced with other packages (e.g. ASE, phonopy) via converters.

What's new at the level of the installation procedure?¶

  • AbiPy now requires python >= 3.7
conda packages for AbiPy and ABINIT are now provided by conda-forge, therefore:
    • new pkgs are automatically built when new releases are pushed to github
    • the abiconda channel is deprecated. Please use conda-forge.
  • AbiPy v1.0 will drop support for Abinit8
  • We plan to provide new AbiPy/ABINIT recipes for EasyBuild and Spack
  • We are looking for volunteers to support other package managers (homebrew, apt-get, etc.)

How to install AbiPy¶

Using conda and the conda-forge channel (recommended):

    conda install abipy --channel conda-forge

Since conda is not limited to python packages, one can install ABINIT in the same env with:

    conda install abinit -c conda-forge


Using pip and python wheels:

    pip install abipy --user

From the github repository:

    git clone https://github.com/abinit/abipy.git
    cd abipy 
    python setup.py install # or `develop` for development


For further info see http://abinit.github.io/abipy/installation.html

What's new at the level of the documentation?¶

  • New website based on sphinx-rtd-theme
  • Gallery of plots with matplotlib and plotly examples (59 scripts in total)
  • New examples for Flows (38 scripts in total)
  • Each example can be executed inside a Docker container on mybinder.org

Feel free to ask for more tutorials/examples¶

In [5]:
#%embed https://abinit.github.io/abipy/gallery/index.html
#%embed https://abinit.github.io/abipy/flow_gallery/index.html

What's new at the level of workflow infrastructure?¶

In addition to the standard workflows/tools for GS, DFPT, $GW$, IP optics, Bethe-Salpeter, we have:

  • New Flows and improved post-processing tools for:

    • elastic and piezoelectric tensors with DFPT (clamped/relaxed atoms)
    • non-linear optical properties (SHG) with DFPT.
    • Grüneisen parameters with DFPT + finite differences
    • quasi-harmonic approximation
    • effective masses with DFPT or finite differences
    • e-ph self-energy, e-ph matrix elements and scattering potentials
    • phonon-limited transport properties
  • Python converters: DDB $\;\rightleftarrows \;$ phonopy / tdep

Overview of new Flows¶

⚠️ There are two different workflow infrastructures:¶

Internal AbiPy implementation (abipy.flowtk modules):¶

  • ✅ Lightweight, no database required
  • ✅ Designed for rapid prototyping and/or for testing advanced ABINIT capabilities
  • ❌ No explicit support for high-throughput (HT) applications.

AbiFlows package (requires Fireworks and MongoDB database)¶

  • ✅ HT-oriented: can use MongoDB to store workflow status and final results for further analysis
  • ✅ High-level API designed for HT applications (e.g. phonon calculations for the Materials Project)
  • ❌ New ABINIT features are first implemented/tested with flowtk and then ported to AbiFlows.

NB: There's an ongoing effort to reimplement AbiFlows in terms of the atomate framework. In this talk, we will mainly discuss the new features available in abipy.flowtk.

ElasticWork¶

This flow computes:¶

  • the rigid-atom elastic tensor
  • the rigid-atom piezoelectric tensor (insulators only)
  • the internal strain tensor
  • the atomic relaxation corrections to the elastic and piezoelectric tensor

For the formalism, see: XXX

Python example¶

scf_input = make_scf_input() # Build input for GS calculation

elast_work = flowtk.ElasticWork.from_scf_input(scf_input, 
                                               with_relaxed_ion=True, 
                                               with_piezo=True)

Dependency Graph:¶


  • Independent perturbations are computed in parallel with optimized MPI parameters.
  • Restart capabilities (timeout limit) and error handlers.
  • Intermediate DDB files are automatically merged.
  • The final DDB file is automatically produced in the outdata directory of the Work.

To compute the elastic tensors from the final DDB file, use:¶

In [7]:
elastic_ddb = abilab.abiopen("elastic_DDB")
edata = elastic_ddb.anaget_elastic()

edata.get_elastic_properties_dataframe(properties_as_index=True)
Out[7]:
property 0 1
0 trans_v 3.194459e+03 3.838052e+03
1 long_v 5.796035e+03 6.295200e+03
2 snyder_ac 5.767044e+01 8.693459e+01
3 snyder_opt 3.158141e-01 3.621134e-01
4 snyder_total 5.798626e+01 8.729670e+01
5 clarke_thermalcond 7.734051e-01 9.006348e-01
6 cahill_thermalcond 8.539415e-01 9.791319e-01
7 debye_temperature 3.760623e+02 4.477874e+02
8 k_voigt 7.553865e+01 7.553930e+01
9 k_reuss 7.553074e+01 7.553579e+01
10 k_vrh 7.553469e+01 7.553754e+01
11 g_voigt 3.951341e+01 5.767416e+01
12 g_reuss 3.761300e+01 5.366047e+01
13 g_vrh 3.856321e+01 5.566731e+01
14 universal_anisotropy 2.527308e-01 3.740360e-01
15 homogeneous_poisson 2.818554e-01 2.041909e-01
16 y_mod 9.886491e+10 1.340681e+11
In [8]:
# Pandas DataFrame with relaxed-atoms elastic tensor.
edata.elastic_relaxed
Out[8]:
xx yy zz yz xz xy
Voigt index
xx 135.262182 54.450376 38.052927 0.00000 0.000000 0.000000
yy 54.450376 135.262181 38.052927 0.00000 0.000000 0.000000
zz 38.052927 38.052926 148.211029 0.00000 0.000000 0.000000
yz 0.000000 0.000000 0.000000 30.55071 0.000000 0.000000
xz 0.000000 0.000000 0.000000 0.00000 30.550709 0.000000
xy 0.000000 0.000000 0.000000 0.00000 0.000000 40.405903
For additional info, please consult this notebook tutorial¶
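As a quick consistency check, the Voigt averages listed in the properties dataframe can be recomputed by hand from the relaxed-atoms tensor printed above. The following standalone sketch (plain numpy, not AbiPy code; the matrix values are copied from the Out[8] table, in GPa) applies the textbook Voigt formulas:

```python
import numpy as np

# Relaxed-atoms elastic tensor in Voigt notation (GPa), copied from the table above.
C = np.array([
    [135.262182,  54.450376,  38.052927,  0.0,       0.0,       0.0],
    [ 54.450376, 135.262181,  38.052927,  0.0,       0.0,       0.0],
    [ 38.052927,  38.052926, 148.211029,  0.0,       0.0,       0.0],
    [  0.0,        0.0,        0.0,      30.55071,   0.0,       0.0],
    [  0.0,        0.0,        0.0,       0.0,      30.550709,  0.0],
    [  0.0,        0.0,        0.0,       0.0,       0.0,      40.405903],
])

# Voigt averages of the bulk and shear moduli:
#   K_V = [C11 + C22 + C33 + 2(C12 + C13 + C23)] / 9
#   G_V = [C11 + C22 + C33 - (C12 + C13 + C23) + 3(C44 + C55 + C66)] / 15
diag = C[0, 0] + C[1, 1] + C[2, 2]
off = C[0, 1] + C[0, 2] + C[1, 2]
shear = C[3, 3] + C[4, 4] + C[5, 5]

k_voigt = (diag + 2 * off) / 9
g_voigt = (diag - off + 3 * shear) / 15
print(k_voigt, g_voigt)  # ~75.54 and ~39.51 GPa, matching k_voigt/g_voigt in Out[7]
```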

To generate a GS input, one can use factory functions (ideal for HT)¶

In [9]:
from abipy.abio.factories import gs_input
gs_input(structure="si.cif", pseudos="14si.pspnc", ecut=8)
Out[9]:
##############################################
#### SECTION: basic
##############################################
ecut 8
ngkpt 8 8 8
shiftk 0.5 0.5 0.5
nshiftk 1
kptopt 1
nsppol 2
nband 16
occopt 3
tolvrs 1e-08
##############################################
#### SECTION: gstate
##############################################
spinat
0.0 0.0 0.6
0.0 0.0 0.6
chksymbreak 0
nspinor 1
nspden 2
charge 0.0
tsmear 0.0036749322175655 Ha
##############################################
#### STRUCTURE
##############################################
natom 2
ntypat 1
typat 1 1
znucl 14
xred
0.0000000000 0.0000000000 0.0000000000
0.2500000000 0.2500000000 0.2500000000
acell 1.0 1.0 1.0
rprim
6.3285005244 0.0000000000 3.6537614813
2.1095001748 5.9665675141 3.6537614813
0.0000000000 0.0000000000 7.3075229627

or build the GS input explicitly if full control is wanted:¶

In [10]:
multi = abilab.ebands_input(structure="si.cif", 
                            pseudos="14si.pspnc",
                            ecut=8, 
                            spin_mode="unpolarized", 
                            smearing=None, 
                            dos_kppa=5000)

multi.get_vars_dataframe("kptopt", "iscf", "ngkpt")
Out[10]:
kptopt iscf ngkpt
dataset 0 1 None [8, 8, 8]
dataset 1 -11 -2 None
dataset 2 1 -2 [14, 14, 14]

$ \renewcommand{\qq}{{\mathbf{q}}} \newcommand{\qnu}{{\qq\nu}} \newcommand{\wqnu}{{\omega_\qnu}} \newcommand{\PDER}[2]{\dfrac{\partial #1}{\partial #2}} \newcommand{\eqnu}{{\epsilon_\qnu}} $

Grüneisen parameters with finite differences¶

This flow computes the mode Grüneisen parameters, i.e. the logarithmic derivatives of the phonon frequencies with respect to the volume $V$:

\begin{equation*} \gamma_\qnu = -\PDER{\ln \wqnu}{\ln V} = -\dfrac{V}{\wqnu}\,\PDER{\wqnu}{V} \end{equation*}

where $D(\mathbf{q})\,\epsilon_{\mathbf{q}\nu} = \omega^2_{\mathbf{q}\nu}\,\epsilon_{\mathbf{q}\nu}$ with $D(\mathbf{q})$ the dynamical matrix.

The derivative wrt $V$ can be rewritten as:

\begin{equation*} \PDER{\omega^2(\mathbf{q}\nu)}{V} = 2\wqnu\,\PDER{\wqnu}{V} \langle \eqnu | \PDER{D(\qq)}{V} | \eqnu \rangle \end{equation*}

where the r.h.s is computed by anaddb using finite differences of DFPT results.
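The finite-difference recipe can be illustrated with a toy model in pure python (a hypothetical sketch, independent of AbiPy/anaddb): take a mode with a known power-law volume dependence $\omega(V) = \omega_0 (V_0/V)^\gamma$, evaluate $\omega$ at three volumes, and recover $\gamma$ from the central difference of $\ln\omega$ versus $\ln V$:

```python
import numpy as np

# Toy phonon mode with a known volume dependence: w(V) = w0 * (V0/V)**gamma,
# so the exact Gruneisen parameter -dln(w)/dln(V) equals gamma.
w0, V0, gamma_exact = 10.0, 40.0, 1.5

def omega(V):
    return w0 * (V0 / V) ** gamma_exact

# Three volumes, as in the Flow: V0 and V0 * (1 +/- 2%)
dv = 0.02 * V0
volumes = np.array([V0 - dv, V0, V0 + dv])
omegas = omega(volumes)

# Central finite difference of ln(w) with respect to ln(V).
# ln(w) is linear in ln(V) for this model, so the stencil is exact here.
gamma_fd = -(np.log(omegas[2]) - np.log(omegas[0])) / \
            (np.log(volumes[2]) - np.log(volumes[0]))
print(gamma_fd)  # ~1.5
```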

Python example:¶

gs_inp = make_scf_input() # Build input for GS calculation

from abipy.flowtk.gruneisen import GruneisenWork
voldelta = gs_inp.structure.volume * 0.02
work = GruneisenWork.from_gs_input(gs_inp, voldelta, ngqpt=[2, 2, 2],
                                   with_becs=False)

Dependency Graph:¶


  • $N$ relaxations at fixed volume (violet task). $N$ in [3, 5, 7]
  • Each relaxation task starts a DFPT computation of the dynamical matrix $D(\mathbf{q}, V)$
  • Finally, invoke anaddb with the $N$ DDB files to produce out_GRUNS.nc

To analyze the results stored in the GRUNS.nc file, use:¶

In [13]:
gruns = abiopen("out_GRUNS.nc")
gruns.plot_gruns_scatter();

Effective masses with DFPT¶

This flow performs:¶

  • SCF followed by NSCF run along k-path to find band edges automatically
  • Compute $\epsilon^{\alpha\beta}_{n\bf{k}}$ and the effective mass tensor at these k-points from the wavefunctions $|u_{n\mathbf{k}}\rangle$, their k-derivatives $|u_{n\mathbf{k}}^\alpha\rangle$, and the Hamiltonian derivatives $H^\alpha_{\mathbf{k}}$, $H^{\alpha\beta}_{\mathbf{k}}$

For the formalism, see J. Laflamme Janssen et al., Phys. Rev. B 93, 205147

Python example:¶

scf_input = make_scf_input()   # Get the SCF input (without SOC)

from abipy.flowtk.effmass_works import EffMassAutoDFPTWork
flow = flowtk.Flow("effmass_flow")

work = EffMassAutoDFPTWork.from_scf_input(scf_input)
flow.register_work(work)

Dependency Graph¶


  • SCF + NSCF along high-symmetry k-path to find band edges automatically
  • Perform a DFPT computation of the $m^*$ tensor at the band edges.
  • Similar Flows for effective masses with finite differences are available as well.
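The finite-difference counterpart is easy to sketch with a toy parabolic band in pure python (a hypothetical example in atomic units, not AbiPy code): sample $\epsilon(k)$ near the band edge and estimate $m^*$ from the second derivative via the standard three-point stencil:

```python
# Parabolic band in atomic units: eps(k) = k**2 / (2 * m_star), so d2eps/dk2 = 1 / m_star.
m_star_exact = 0.26  # hypothetical effective mass (units of the electron mass)

def eps(k):
    return k ** 2 / (2.0 * m_star_exact)

# Three-point stencil for the second derivative around the band edge at k = 0
h = 0.01
d2eps = (eps(h) - 2.0 * eps(0.0) + eps(-h)) / h ** 2

m_star_fd = 1.0 / d2eps
print(m_star_fd)  # ~0.26
```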

e-ph matrix elements along an arbitrary q-path¶

This flow computes:¶

the e-ph matrix elements in AlAs along a q-path.
The final results are stored in the GKQ.nc file (one file per q-point) in the outdata directory of each task.

Python example:¶

# Build input for GS calculation on a 2x2x2 k-mesh
scf_input = make_scf_input(ngkpt=(2, 2, 2))

# q-mesh for phonons
ngqpt = (2, 2, 2)

# Create flow to compute all the independent atomic perturbations
# Use ndivsm = 0 to pass an explicit list of q-points.
# If ndivsm > 0, qpath_list is interpreted as a list of boundaries for the q-path
qpath_list = [[0.0, 0.0, 0.0], [0.01, 0, 0], [0.1, 0, 0],
              [0.24, 0, 0], [0.3, 0, 0], [0.45, 0, 0], [0.5, 0.0, 0.0]]

from abipy.flowtk.eph_flows import GkqPathFlow
flow = GkqPathFlow.from_scf_input("flow_dir", scf_input,
                                  ngqpt, qpath_list, ndivsm=0, with_becs=True,
                                  ddk_tolerance={"tolwfr": 1e-8})

Formalism:

https://abinit.github.io/abipy/flow_gallery/run_gkq.html#sphx-glr-flow-gallery-run-gkq-py

Dependency Graph:¶


  • SCF + NSCF along k-path to find band edges automatically
  • Used in ...

DDB converters¶

(contributed by G. Petretto)

  • Based on previous work by H. Xue, E. Bousquet and A. Romero
  • Can be used to:

    • connect the ABINIT DFPT part with other packages requiring phonopy files (e.g. anharmonic calculations with hiPhive)
    • interface AbiPy with phonopy tools e.g. irreps
  • Algorithm:

    • Runs anaddb to get the interatomic force constants IFC($\bf{R}$), BECs and $\epsilon^\infty$
    • Save results in nc format (anaddb.nc)
    • Convert IFC($\bf{R}$) and tensors from ABINIT to phonopy conventions
    • Optionally, save phonopy files in output_dir_path.

In python, everything boils down to:

ddb = abilab.abiopen("mp-149_DDB")
phonopy_obj = ddb.anaget_phonopy_ifc(output_dir_path="output_dir")

Example: obtain phonon irreps from the DDB using phonopy:

ddb = abilab.abiopen("mp-149_DDB")
ph = ddb.anaget_phonopy_ifc(ngqpt=[1, 1, 1])
ph.set_irreps([0, 0, 0])
ph.get_irreps().show()

Other AbiPy applications are presented in the following talks:¶

  • Electron-phonon beyond Fröhlich: dynamical quadrupoles in polar and covalent solids by G. Brunin
  • Phonon-limited conductivity in 2D and 3D metals by O. Nadeau
  • Absorption spectrum calculations using cumulant expansion in electron-phonon interactions by J. C. Abreu
  • Automating ΔSCF computations of point defects using AbiPy workflows by J. Bouquiaux

What's new at the level of the post-processing tools?¶

  • New plotting tools based on plotly
  • GUIs and dashboards based on panel and bokeh


Why plotly?¶


Matplotlib:

  • ✅ Publication-quality figures.
  • ✅ Flexible python API able to produce rather advanced plots
  • ❌ Plots are difficult to customize without changing the python code
  • ❌ Plots lack interactivity and integration with HTML/JS



Plotly:

  • ✅ Interactive plots + a chart-editor GUI to customize the figure
  • ✅ Plays well with HTML (plotly is written in JS with python bindings)
  • ❌ Open-source project, but not all features are available in the free plan
  • ❌ Requires a browser (this may represent an issue on some HPC centers)

AbiPy plots with matplotlib¶

In [15]:
gsr = abiopen("si_nscf_GSR.nc")
gsr.ebands.plot(with_gaps=True);

AbiPy plots with plotly¶

(contributed by Y. He)

In [16]:
gsr.ebands.plotly(with_gaps=True);  # object.plot becomes object.plotly

To upload the plotly figure to the chart studio server, use:¶

gsr.ebands.plotly(with_gaps=True, chart_studio=True);

Users can finally customize the AbiPy plot without changing the python code 🎉¶

Interactive 3d plots with plotly:¶

In [18]:
gsr.ebands.kpoints.plotly(title="k-path in 3d with plotly");

The matplotlib 3d plot embedded in HTML:¶

In [19]:
gsr.ebands.kpoints.plot(title="Static matplotlib figure");

Integrating Abipy with web-based technologies¶

  • Integration between panel and AbiPy
  • AbiPy GUIs inside jupyter notebooks
  • Dashboards and web apps
  • Integration with the AbiPy command line interface
  • Web apps for users (🚧)

What is Panel?¶

  • Panel provides tools for composing widgets, plots, tables into web apps and dashboards
  • It relies on the client-server model where:
    • the client is the web browser running HTML/CSS/JS code.
    • the server communicates with the client, executes python code and sends the results back to the client
  • In a nutshell, panel keeps the browser and python in sync, e.g. the user clicks a button in the GUI and the signal is sent to python.

Why panel?¶

  • It works with Bokeh, Plotly, Matplotlib, HoloViews, and many other python plotting libraries.
  • It works equally well in Jupyter Notebooks

Pros and cons of the client-server model¶

Advantages:¶

  • The client does not need to install the scientific software stack (when client and server run on different machines)
  • Can implement web apps that allow the user to upload data and analyze the results, e.g. the DDB GUI.

Disadvantages¶

  • Round-trip delay if client != host and the connection is slow
  • Uploading a 1 GB file to the remote server just because you don't want to install software on the localhost is a very bad idea.
  • OK for relatively small files (< 1 GB), but this approach is not designed to handle big data.
  • Not all HPC centers provide specialized nodes to post-process the results inside a web browser/notebook.
In [20]:
#gsr = abiopen("si_nscf_GSR.nc")
#abilab.abipanel();
#gsr.get_panel()

How to use AbiPy panels inside jupyter notebooks¶

To build a panel GUI inside the notebook, call the get_panel method:¶

In [21]:
ddb = abilab.abiopen("ZnSe_hex_qpt_DDB")

abilab.abipanel(); # Important
ddb.get_panel()
Out[21]:

Other AbiPy files/objects are supported as well:¶

In [22]:
ddb.structure.get_panel()
Out[22]:
Don't be surprised if you start clicking buttons and nothing happens in the GUI! One needs a running python backend to execute the callbacks triggered by the widgets.

Creating dashboards from the command line¶

  • Creating panels inside a notebook is great if you need GUIs plus the possibility of executing python code.
  • There are however cases in which we only need a dashboard with widgets to interact with the data.

To open a dashboard for the DDB file from the shell, use the --panel option:¶

abiopen.py out_DDB --panel  # or -pn if you prefer the short option.

To produce a predefined set of matplotlib figures, use:¶

abiopen.py mgb2_kpath_FATBANDS.nc --expose --seaborn
abiopen.py mgb2_kpath_FATBANDS.nc --expose-web # -ewb
abiopen.py mgb2_kpath_FATBANDS.nc --plotly # -ply

abiopen_expose

To visualize all the results inside the browser, use¶

abiopen.py out_GRUNS.nc --expose-web # -ew for the short version

🆕 This is a new feature of abiopen.py made possible by panel. See next slides.¶

Need to call anaddb to compute and visualize ph-bands and DOS from DDB?¶

abiview.py ddb ZnSe_hex_qpt_DDB --plotly # -ply

abiview_ddb_plotly

Documentation for these new features available at:¶

In [23]:
%embed https://abinit.github.io/abipy/graphical_interface.html
Out[23]:
In [24]:
#plotter = abilab.ElectronBandsPlotter()
#plotter.add_ebands(label="BZ sampling", bands="si_scf_GSR.nc")
#plotter.add_ebands(label="k-path", bands="si_nscf_GSR.nc")
#plotter.gridplot(with_gaps=True);

How to run¶

In order to run a Flow, we need two configuration files:¶

  1. scheduler.yml providing:

    • scheduler parameters such as days, hours, minutes, max_njobs_inqueue, max_ncores_used, …
  2. manager.yml providing:

    • list of shell commands to be executed before running ABINIT
    • list of modules to load
    • options for the queue manager (bluegene, moab, pbspro, sge, shell, slurm, torque)

See this page for examples or use the abidoc.py script and the syntax:

  • abidoc.py scheduler
  • abidoc.py manager
  • abidoc.py manager slurm
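As a rough illustration, a minimal scheduler.yml and a skeleton manager.yml for a Slurm cluster might look as follows (hypothetical values; the exact schema is documented by abidoc.py and the installation page, so adapt the queue/limits/hardware sections to your machine):

```yaml
# scheduler.yml -- wake up the scheduler every 5 minutes (hypothetical values)
days: 0
hours: 0
minutes: 5
```

```yaml
# manager.yml -- skeleton for a Slurm cluster (hypothetical values)
qadapters:
  - priority: 1
    queue:
      qtype: slurm
      qname: main
    job:
      modules:
        - abinit
      pre_run:
        - "export OMP_NUM_THREADS=1"
      mpi_runner: mpirun
    limits:
      timelimit: "0:30:00"
      max_cores: 16
    hardware:
      num_nodes: 1
      sockets_per_node: 1
      cores_per_socket: 16
      mem_per_node: 32Gb
```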

How to run calculations?¶

  • The simplest way to start the scheduler from the shell is via the syntax:
run_elastic.py --scheduler # -s for the short option
  • For non-trivial Flows, we suggest putting the scheduler in the background with nohup so that we can disconnect from the shell session without killing the scheduler.
nohup run_elastic.py -s > log 2> err &
  • Obviously, it is also possible to submit a Slurm script that executes the python script on a compute node with 1 core.
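The last option can be sketched as a short batch script (hypothetical sketch; module names, partitions and time limits are site-specific):

```bash
#!/bin/bash
#SBATCH --job-name=abipy-scheduler
#SBATCH --ntasks=1              # the scheduler itself needs a single core
#SBATCH --time=2-00:00:00       # must outlive the whole Flow

# Hypothetical environment setup: make python + AbiPy available on the compute node.
module load anaconda3

# The scheduler then submits the actual ABINIT jobs according to manager.yml.
run_elastic.py --scheduler > sched.log 2> sched.err
```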